By expanding the vocabulary, curating the training data, and employing a multi-stage training strategy, a Chinese version of LLaMA2 was trained in just 15 hours at a cost of a few thousand yuan. The resulting model significantly improves on LLaMA2 across a range of Chinese tasks, reaching an advanced level among models of the same scale. The complete building process, training code, and pre-trained weights are all open source, so the recipe can be transferred directly to other languages and domains for rapid, low-cost large model training. A minimal sketch of the vocabulary-expansion step follows.
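
The sketch below illustrates vocabulary expansion with the Hugging Face `transformers` API, assuming a LLaMA2 checkpoint and a list of new Chinese tokens mined from a corpus; the checkpoint name and token list are illustrative placeholders, not the project's actual artifacts.

```python
# Sketch: expand a LLaMA2 tokenizer with Chinese tokens and resize the
# model's embeddings so the new ids get trainable rows.
from transformers import LlamaForCausalLM, LlamaTokenizer

model_name = "meta-llama/Llama-2-7b-hf"  # assumed base checkpoint
tokenizer = LlamaTokenizer.from_pretrained(model_name)
model = LlamaForCausalLM.from_pretrained(model_name)

# Hypothetical new tokens; in practice these would come from a tokenizer
# trained on a large Chinese corpus (e.g. with SentencePiece) and merged
# with the original vocabulary.
new_tokens = ["你好", "模型", "训练"]
num_added = tokenizer.add_tokens(new_tokens)

# New embedding rows are randomly initialized and learned during the
# subsequent continual pretraining stages.
model.resize_token_embeddings(len(tokenizer))
print(f"Added {num_added} tokens; vocab size is now {len(tokenizer)}")
```

Expanding the vocabulary this way lets common Chinese words map to single tokens instead of many byte-level pieces, which shortens sequences and is one reason continual pretraining on Chinese data can be done so cheaply.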